Metric Learning across Heterogeneous Domains by Respectively Aligning Both Priors and Posteriors

Authors

  • Qiang Qian
  • Songcan Chen
Abstract

In this paper, we attempt to learn a single metric across two heterogeneous domains, where the source domain is fully labeled and has many samples while the target domain has only a few labeled samples but abundant unlabeled samples. To the best of our knowledge, this task has seldom been addressed. The proposed learning model has a simple underlying motivation: all samples from both the source and the target domains are mapped into a common space, where their priors P(sample) and their posteriors P(label|sample) are each forced to be aligned as much as possible. We show that the two mappings, from the source domain and from the target domain to the common space, can be reparameterized into a single positive semi-definite (PSD) matrix. We then develop an efficient Bregman projection algorithm to optimize this PSD matrix, regularized by a LogDet function. Furthermore, we show that the model can easily be kernelized, and we verify its effectiveness on a cross-language retrieval task and a cross-domain object recognition task.
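To make the pipeline described in the abstract concrete, the sketch below illustrates one plausible reading (an assumption on our part, not the authors' released code or exact formulation): source and target samples are zero-padded into a single concatenated space, so one PSD matrix M = W^T W with W = [A_s, A_t] carries both domain mappings at once, since the distance between a padded source sample and a padded target sample under M equals ||A_s x_s - A_t x_t||^2. The update rule shown is a LogDet-regularized Bregman projection over pairwise constraints in the style of ITML (Davis et al., 2007), used only to illustrate the kind of machinery the abstract names; all function names, the padding scheme, the constraint construction, and the hyperparameters are illustrative assumptions.

    # Illustrative sketch only: ITML-style LogDet/Bregman projections on a PSD
    # matrix over a concatenated source+target feature space. Padding scheme,
    # constraint construction, and hyperparameters are assumptions, not the
    # authors' exact model.
    import numpy as np

    def pad_source(x_s, d_t):
        # Source sample occupies the first block of the concatenated space.
        return np.concatenate([x_s, np.zeros(d_t)])

    def pad_target(x_t, d_s):
        # Target sample occupies the second block of the concatenated space.
        return np.concatenate([np.zeros(d_s), x_t])

    def bregman_update(M, d, delta, lam, xi, gamma=1.0):
        # One Bregman projection of M onto a single pairwise constraint.
        # d: difference of two padded vectors; delta: +1 similar, -1 dissimilar.
        p = float(d @ M @ d)                          # squared distance under M
        if p < 1e-12:                                 # identical points: nothing to do
            return M, lam, xi
        alpha = min(lam, 0.5 * delta * (1.0 / p - gamma / xi))
        beta = delta * alpha / (1.0 - delta * alpha * p)
        xi_new = gamma * xi / (gamma + delta * alpha * xi)
        Md = M @ d
        M_new = M + beta * np.outer(Md, Md)           # rank-one update keeps M PSD
        return M_new, lam - alpha, xi_new

    def learn_metric(pairs, labels, dim, u=1.0, l=10.0, n_sweeps=20):
        # pairs: list of (z_a, z_b) padded vectors; labels: +1 / -1 per pair.
        # The LogDet regularizer implicitly pulls M back toward the identity.
        M = np.eye(dim)
        lam = np.zeros(len(pairs))
        xi = np.where(np.asarray(labels) > 0, u, l).astype(float)
        for _ in range(n_sweeps):
            for c, ((za, zb), delta) in enumerate(zip(pairs, labels)):
                M, lam[c], xi[c] = bregman_update(M, za - zb, delta, lam[c], xi[c])
        return M

Under this reading, similar pairs could link each labeled target sample to source samples of the same class (pushing posteriors to agree) and dissimilar pairs to other classes, while unlabeled target samples could enter prior-alignment terms; recovering A_s and A_t explicitly, if needed, amounts to factoring M = W^T W and splitting W into its two column blocks.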

Related articles

On Combining Side Information and Unlabeled Data for Heterogeneous Multi-Task Metric Learning

Distance metric learning (DML) is critical for a wide variety of machine learning algorithms and pattern recognition applications. Transfer metric learning (TML) leverages the side information (e.g., similar/dissimilar constraints over pairs of samples) from related domains to help the target metric learning (with limited information). Current TML tools usually assume that different domains exp...

Cross-Domain Metric Learning Based on Information Theory

Supervised metric learning plays a substantial role in statistical classification. Conventional metric learning algorithms have limited utility when the training data and testing data are drawn from related but different domains (i.e., source domain and target domain). Although this issue has got some progress in feature-based transfer learning, most of the work in this area suffers from non-tr...

A Variational Bayesian Framework for Graphical Models

This paper presents a novel practical framework for Bayesian model averaging and model selection in probabilistic graphical models. Our approach approximates full posterior distributions over model parameters and structures, as well as latent variables, in an analytical manner. These posteriors fall out of a free-form optimization procedure, which naturally incorporates conjugate priors. Unlike...

Heterogeneous Unsupervised Cross-domain Transfer Learning

Transfer learning leverages the knowledge in one domain – the source domain – to improve learning efficiency in another domain – the target domain. Existing transfer learning research is relatively well-progressed, but only in situations where the feature spaces of the domains are homogeneous and the target domain contains at least a few labeled instances. However, transfer learning has not bee...

Journal:
  • CoRR

Volume: abs/1208.1829  Issue: -

Pages: -

Publication date: 2012